# Lightweight MoE

## Phi Mini MoE Instruct GGUF

- License: MIT
- Author: gabriellarson
- Downloads: 2,458
- Tags: Large Language Model, English

Phi-mini-MoE is a lightweight Mixture of Experts (MoE) model aimed at English business and research scenarios. It performs well in resource-constrained, low-latency settings.
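Because the checkpoint is distributed in GGUF format, it can be run locally with llama.cpp bindings. Below is a minimal sketch using llama-cpp-python; the repository ID and quantization filename are assumptions, so check the model page for the files actually published.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Hypothetical repo ID and quantization filename -- confirm on the model page.
model_path = hf_hub_download(
    repo_id="gabriellarson/Phi-mini-MoE-instruct-GGUF",
    filename="Phi-mini-MoE-instruct-Q4_K_M.gguf",
)

# Load the GGUF file and run a short completion on CPU.
llm = Llama(model_path=model_path, n_ctx=2048)
result = llm(
    "Explain in one sentence why Mixture of Experts models can be fast at inference.",
    max_tokens=64,
)
print(result["choices"][0]["text"])
```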
## TinyMixtral 4x248M MoE

- License: Apache-2.0
- Author: Isotonic
- Downloads: 1,310
- Tags: Large Language Model, Transformers

TinyMixtral-4x248M-MoE is a small language model built on the Mixture of Experts (MoE) architecture, created by merging multiple TinyMistral variants. It is suited to text generation tasks.
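Since the model card lists Transformers support, a minimal text-generation sketch with the Hugging Face transformers library is shown below. The repository ID is an assumption built from the author and model name; verify it against the model page.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo ID built from the author and model name -- verify on the model page.
repo_id = "Isotonic/TinyMixtral-4x248M-MoE"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Generate a short continuation to confirm the merged MoE checkpoint loads and runs.
inputs = tokenizer("Mixture of Experts models route each token to", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```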